Optimal regression rates for SVMs using Gaussian kernels

Authors

  • Mona Eberts
  • Ingo Steinwart
Abstract

Support vector machines (SVMs) using Gaussian kernels are one of the standard and state-of-the-art learning algorithms. In this work, we establish new oracle inequalities for such SVMs when applied to either least squares or conditional quantile regression. With the help of these oracle inequalities we then derive learning rates that are (essentially) minimax optimal under standard smoothness assumptions on the target function. We further utilize the oracle inequalities to show that these learning rates can be adaptively achieved by a simple data-dependent parameter selection method that splits the data set into a training and a validation set. AMS 2000 subject classifications: Primary 62G08; secondary 62G05, 68Q32, 68T05.
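The data-dependent parameter selection the abstract describes can be sketched in a few lines. The following is a hedged illustration, not the paper's implementation: a regularized least-squares fit with a Gaussian kernel, where the kernel width gamma and the regularization strength lambda are picked by empirical risk on a held-out validation half. The candidate grids, the toy data, and all helper names are assumptions made up for the example.

```python
import math
import random

def k_gauss(x, z, gamma):
    # Gaussian (RBF) kernel on the real line.
    return math.exp(-gamma * (x - z) ** 2)

def solve(A, b):
    # Gaussian elimination with partial pivoting; returns x with A x = b.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_lssvm(xs, ys, gamma, lam):
    # Regularized least squares in the RKHS: solve (K + lam * n * I) alpha = y.
    n = len(xs)
    K = [[k_gauss(xs[i], xs[j], gamma) for j in range(n)] for i in range(n)]
    for i in range(n):
        K[i][i] += lam * n
    alpha = solve(K, ys)
    return lambda x: sum(a * k_gauss(x, xi, gamma) for a, xi in zip(alpha, xs))

random.seed(0)
xs = [random.uniform(-1, 1) for _ in range(80)]
ys = [math.sin(3 * x) + 0.05 * random.gauss(0, 1) for x in xs]

# Split the sample into a training half and a validation half, as in the abstract.
xtr, ytr, xval, yval = xs[:40], ys[:40], xs[40:], ys[40:]

best = None
for gamma in (0.5, 2.0, 8.0):       # candidate kernel widths (illustrative grid)
    for lam in (1e-3, 1e-2, 1e-1):  # candidate regularization strengths
        f = fit_lssvm(xtr, ytr, gamma, lam)
        err = sum((f(x) - y) ** 2 for x, y in zip(xval, yval)) / len(xval)
        if best is None or err < best[0]:
            best = (err, gamma, lam)

print("validation MSE %.4f at gamma=%s, lambda=%s" % best)
```

The pair (gamma, lambda) minimizing the validation risk is returned; the theoretical point of the paper is that this simple split-and-validate rule attains the minimax-optimal rates adaptively, without knowing the smoothness of the target function.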


Similar articles

Optimal learning rates for least squares SVMs using Gaussian kernels

We prove a new oracle inequality for support vector machines with Gaussian RBF kernels solving the regularized least squares regression problem. To this end, we apply the modulus of smoothness. With the help of the new oracle inequality we then derive learning rates that can also be achieved by a simple data-dependent parameter selection method. Finally, it turns out that our learning rates are...


Learning Theory Estimates with Observations from General Stationary Stochastic Processes

This letter investigates the supervised learning problem with observations drawn from certain general stationary stochastic processes. Here by general, we mean that many stationary stochastic processes can be included. We show that when the stochastic processes satisfy a generalized Bernstein-type inequality, a unified treatment on analyzing the learning schemes with various mixing processes ca...


Optimal Learning Rates for Localized SVMs

One of the limiting factors of using support vector machines (SVMs) in large scale applications are their super-linear computational requirements in terms of the number of training samples. To address this issue, several approaches that train SVMs on many small chunks separately have been proposed in the literature. With the exception of random chunks, which is also known as divide-and-conquer ...


Predicting the Nonlinear Dynamics of Biological Neurons using Support Vector Machines with Different Kernels

Based on biological data, we examine the ability of support vector machines (SVMs) with Gaussian, polynomial, and tanh kernels to learn and predict the nonlinear dynamics of single biological neurons. We show that SVMs for regression learn the dynamics of the pyloric dilator neuron of the Australian crayfish, and we determine the optimal SVM parameters with regard to the test error. Compared to conv...


Fast Rates for Support Vector Machines using Gaussian Kernels

We establish learning rates up to the order of n^{-1} for support vector machines with hinge loss (L1-SVMs) and nontrivial distributions. For the stochastic analysis of these algorithms we use recently developed concepts such as Tsybakov's noise assumption and local Rademacher averages. Furthermore we introduce a new geometric noise condition for distributions that is used to bound the approximati...



Journal:

Volume   Issue

Pages  -

Publication date: 2012